A Proximal Alternating Direction Method for Semi-Definite Rank Minimization
Authors
Abstract
Semi-definite rank minimization problems model a wide range of applications in both the signal processing and machine learning fields. This class of problems is NP-hard in general. In this paper, we propose a proximal Alternating Direction Method (ADM) for the well-known semi-definite rank-regularized minimization problem. Specifically, we first reformulate this NP-hard problem as an equivalent biconvex MPEC (Mathematical Program with Equilibrium Constraints), and then solve it using proximal ADM, which involves solving a sequence of structured convex semi-definite subproblems to find a desirable solution to the original rank-regularized optimization problem. Moreover, based on the Kurdyka-Łojasiewicz inequality, we prove that the proposed method always converges to a KKT stationary point under mild conditions. We apply the proposed method to the widely studied and popular sensor network localization problem. Our extensive experiments demonstrate that the proposed algorithm outperforms state-of-the-art low-rank semi-definite minimization algorithms in terms of solution quality.
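As a rough illustration of the MPEC idea referenced above (the notation f, λ, V, X, n below is ours, and the exact reformulation used in the paper may differ), the rank of a positive semi-definite matrix admits the variational characterization

\[
\operatorname{rank}(X) \;=\; \min_{0 \preceq V \preceq I} \operatorname{tr}(I - V)
\quad \text{s.t.} \quad \langle V, X \rangle = 0, \qquad X \succeq 0,
\]

since \(\langle V, X\rangle = 0\) with \(V, X \succeq 0\) forces the range of \(V\) into the null space of \(X\), so \(\operatorname{tr}(V) \le n - \operatorname{rank}(X)\), with equality when \(V\) is the orthogonal projector onto that null space. Substituting this into a rank-regularized objective \(f(X) + \lambda\,\operatorname{rank}(X)\) yields a problem whose complementarity (equilibrium) constraint is linear in \(X\) for fixed \(V\) and linear in \(V\) for fixed \(X\); this biconvex structure is what an alternating scheme such as proximal ADM can exploit through a sequence of convex semi-definite subproblems.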
Similar Articles
An inexact alternating direction method with SQP regularization for the structured variational inequalities
In this paper, we propose an inexact alternating direction method with square quadratic proximal (SQP) regularization for the structured variational inequalities. The predictor is obtained by solving the SQP system approximately under a significantly relaxed accuracy criterion, and the new iterate is computed directly by an explicit formula derived from the original SQP method. Under appropriate...
Hankel Matrix Rank Minimization with Applications to System Identification and Realization
We introduce a flexible optimization framework for nuclear norm minimization of matrices with linear structure, including Hankel, Toeplitz and moment structures, and catalog applications from diverse fields under this framework. We discuss various first-order methods for solving the resulting optimization problem, including alternating direction methods of multipliers, proximal point algorithms...
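As a concrete, minimal sketch of the kind of problem this framework covers (the cvxpy-based code below, including the data y, tolerance eps, and dimensions m and n, is an illustrative assumption rather than the authors' implementation):

import numpy as np
import cvxpy as cp

# Illustrative sketch: nuclear-norm minimization over Hankel-structured matrices.
m, n = 10, 10                        # Hankel block dimensions (assumed)
y = np.random.randn(m + n - 1)       # observed scalar sequence (placeholder data)
eps = 0.5                            # data-fit tolerance (assumed)

x = cp.Variable(m + n - 1)           # signal whose entries fill the Hankel matrix
H = cp.Variable((m, n))              # matrix constrained to be Hankel in x

# Linear structure constraints: H[i, j] depends only on i + j (Hankel pattern).
constraints = [H[i, j] == x[i + j] for i in range(m) for j in range(n)]
constraints += [cp.norm(x - y, 2) <= eps]

# Nuclear norm of H as a convex surrogate for low rank; for Hankel matrices,
# low rank corresponds to a low-order linear system realization.
prob = cp.Problem(cp.Minimize(cp.normNuc(H)), constraints)
prob.solve()

At this toy scale a generic conic solver suffices; first-order methods such as ADMM or proximal point algorithms become attractive as the structured matrices grow.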
Improving an ADMM-like Splitting Method via Positive-Indefinite Proximal Regularization for Three-Block Separable Convex Minimization
The augmented Lagrangian method (ALM) is fundamental for solving convex minimization models with linear constraints. When the objective function is separable, such that it can be represented as the sum of more than one function without coupled variables, various splitting versions of the ALM have been well studied in the literature, such as the alternating direction method of multiplier...
A Convergent 3-Block Semi-Proximal ADMM for Convex Minimization Problems with One Strongly Convex Block
In this paper, we present a semi-proximal alternating direction method of multipliers (ADMM) for solving 3-block separable convex minimization problems with the second block in the objective being a strongly convex function and one coupled linear equation constraint. By choosing the semi-proximal terms properly, we establish the global convergence of the proposed semi-proximal ADMM for the step...
Inexact alternating direction method based on Newton descent algorithm with application to Poisson image deblurring
The recovery of images from observations that are degraded by a linear operator and further corrupted by Poisson noise is an important task in modern imaging applications such as astronomical and biomedical ones. Gradient-based regularizers involving the popular total variation semi-norm have become standard techniques for Poisson image restoration due to their edge-preserving ability. Various e...
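For context, a standard formulation of this restoration problem (stated here for illustration; the cited paper's exact model may differ) combines the Poisson negative log-likelihood with a total-variation penalty:

\[
\min_{u \ge 0} \;\; \sum_i \Big( (Au)_i - f_i \log (Au)_i \Big) \;+\; \lambda \, \mathrm{TV}(u),
\]

where \(A\) is the known blur (linear) operator, \(f\) the observed photon counts, \(u\) the image to recover, and \(\lambda > 0\) balances data fidelity against the edge-preserving TV term.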